Entropy

In thermodynamics, entropy (usual symbol ''S'') is a measure of the number of specific ways in which a thermodynamic system may be arranged, commonly understood as a measure of disorder. According to the second law of thermodynamics, the entropy of an isolated system never decreases; such a system spontaneously proceeds towards thermodynamic equilibrium, the configuration with maximum entropy. Systems that are not isolated may decrease in entropy, provided they increase the entropy of their environment by at least the same amount. Since entropy is a state function, the change in the entropy of a system is the same for any process between a given initial state and a given final state, whether the process is reversible or irreversible. Irreversible processes, however, increase the combined entropy of the system and its environment.
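As an illustrative aside (not part of the original article text): the state-function property is what makes entropy changes computable even for irreversible processes. In the free (Joule) expansion of an ideal gas into a vacuum, no heat flows and no work is done, yet the entropy increases; because only the endpoints matter, ΔS can be evaluated along a reversible isothermal path between the same two states, giving ΔS = nR ln(V2/V1). A minimal Python sketch:
```python
import math

R = 8.314  # gas constant, J K^-1 mol^-1

def delta_S_free_expansion(n_mol: float, V1: float, V2: float) -> float:
    """Entropy change (J/K) when n_mol of ideal gas freely expands from V1 to V2.

    The real process is irreversible and exchanges no heat, but S is a
    state function, so Delta-S is evaluated along a reversible isothermal
    path between the same states: Delta-S = n * R * ln(V2 / V1).
    """
    return n_mol * R * math.log(V2 / V1)

# One mole doubling its volume gains R*ln(2), about 5.76 J/K of entropy,
# all of it created inside the gas (the environment is unchanged).
print(delta_S_free_expansion(1.0, 1.0, 2.0))
```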
The change in entropy (Δ''S'') of a system was originally defined for a thermodynamically reversible process as
:\Delta S = \int \frac{dQ_\text{rev}}{T},
where ''T'' is the absolute temperature of the system, dividing an incremental reversible transfer of heat into that system (''dQ''<sub>rev</sub>). (If heat is transferred out, the sign would be reversed, giving a decrease in the entropy of the system.) The above definition is sometimes called the macroscopic definition of entropy because it can be used without regard to any microscopic description of the contents of a system. The concept of entropy has been found to be generally useful and has several other formulations. Entropy was discovered when it was noticed to be a quantity that behaves as a function of state, as a consequence of the second law of thermodynamics.
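As a worked example of the macroscopic definition (a hedged sketch; the scenario and values are illustrative, not from the article): for a body of constant heat capacity heated reversibly, dQ_rev = mc dT, and the Clausius integral evaluates in closed form to ΔS = mc ln(T2/T1).
```python
import math

def delta_S_heating(mass_kg: float, c_J_per_kgK: float,
                    T1_K: float, T2_K: float) -> float:
    """Delta-S for reversibly heating a body of constant specific heat c.

    With dQ_rev = m * c * dT, the Clausius integral becomes
    Delta-S = integral of m * c * dT / T = m * c * ln(T2 / T1).
    """
    return mass_kg * c_J_per_kgK * math.log(T2_K / T1_K)

# Heating 1 kg of liquid water (c ~ 4184 J/(kg K)) from 293.15 K to 353.15 K
# yields roughly 779 J/K.
print(delta_S_heating(1.0, 4184.0, 293.15, 353.15))
```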
Entropy is an extensive property. It has the dimension of energy divided by temperature, with the SI unit joules per kelvin (J K⁻¹), or kg m² s⁻² K⁻¹ in terms of base units. However, the entropy of a pure substance is usually given as an intensive property: either entropy per unit mass (SI unit: J K⁻¹ kg⁻¹) or entropy per unit amount of substance (SI unit: J K⁻¹ mol⁻¹).
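A short numeric sketch of the extensive/intensive distinction (the molar-entropy figure below is an assumed approximate value for liquid water, used only for illustration):
```python
# Extensive vs. intensive entropy (illustrative, assumed values).
S_molar = 69.9    # approx. standard molar entropy of liquid water, J K^-1 mol^-1
M = 0.018015      # molar mass of water, kg mol^-1

# Intensive form: entropy per unit mass (independent of sample size).
S_specific = S_molar / M  # about 3880 J K^-1 kg^-1

# Extensive form: total entropy scales linearly with the amount present.
n = 2.0                   # moles
S_total = n * S_molar     # 139.8 J/K for two moles

print(f"specific entropy: {S_specific:.0f} J/(K kg)")
print(f"total entropy of {n} mol: {S_total:.1f} J/K")
```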
The ''absolute'' entropy (''S'' rather than Δ''S'') was defined later, using either statistical mechanics or the third law of thermodynamics.
In the modern microscopic interpretation given by statistical mechanics, entropy is the amount of additional information needed to specify the exact physical state of a system, given its thermodynamic specification. Understanding the role of thermodynamic entropy in various processes requires an understanding of how and why that information changes as the system evolves from its initial to its final condition. It is often said that entropy is an expression of the disorder or randomness of a system, or of our lack of information about it. The second law is now often seen as an expression of the fundamental postulate of statistical mechanics through the modern definition of entropy.
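This information-theoretic reading can be made concrete with the Gibbs formula S = −k_B Σ p_i ln p_i, which reduces to Boltzmann's S = k_B ln W when all W microstates are equally probable. The sketch below (with illustrative probabilities, not from the article) checks both facts numerically:
```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in SI)

def gibbs_entropy(probs):
    """S = -k_B * sum(p * ln p) over microstate probabilities p."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

# A uniform distribution over W microstates recovers S = k_B ln W,
# the maximum-entropy (least-information) case:
W = 4
uniform = [1.0 / W] * W
assert abs(gibbs_entropy(uniform) - k_B * math.log(W)) < 1e-30

# Any sharper (more informative) distribution has lower entropy:
peaked = [0.7, 0.1, 0.1, 0.1]
print(gibbs_entropy(peaked) < gibbs_entropy(uniform))  # True
```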
==History==
The first law of thermodynamics, deduced from the heat-friction experiments of James Joule in 1843, expresses the concept of energy and its conservation in all processes; the first law, however, is unable to quantify the effects of friction and dissipation.
In the 1850s and 1860s, the German physicist Rudolf Clausius objected to the supposition that no change occurs in the working body, and gave this "change" a mathematical interpretation by questioning the nature of the inherent loss of usable heat when work is done, e.g. heat produced by friction. Clausius described entropy as the ''transformation-content'', i.e. the dissipative use of energy, of a thermodynamic system or working body of chemical species during a change of state. This was in contrast to earlier views, based on the theories of Isaac Newton, that heat was an indestructible particle that had mass.
Later, scientists such as Ludwig Boltzmann, Josiah Willard Gibbs, and James Clerk Maxwell gave entropy a statistical basis. In 1877 Boltzmann visualized a probabilistic way to measure the entropy of an ensemble of ideal gas particles, in which he defined entropy to be proportional to the logarithm of the number of microstates such a gas could occupy. Henceforth, the essential problem in statistical thermodynamics has been, according to Erwin Schrödinger, to determine the distribution of a given amount of energy ''E'' over ''N'' identical systems.
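Boltzmann's counting can be illustrated with a toy model (an assumed example, not from the article): for N distinguishable two-state particles, the number of microstates with n excited particles is the binomial coefficient W = N!/(n!(N−n)!), and S = k_B ln W is largest for the evenly split macrostate:
```python
import math

k_B = 1.380649e-23  # J/K

def boltzmann_entropy_two_state(N: int, n_up: int) -> float:
    """S = k_B ln W for N distinguishable two-state particles, n_up excited.

    W = N! / (n_up! (N - n_up)!) microstates share this macrostate;
    lgamma gives ln(x!) = lgamma(x + 1) without overflow for large N.
    """
    ln_W = math.lgamma(N + 1) - math.lgamma(n_up + 1) - math.lgamma(N - n_up + 1)
    return k_B * ln_W

# The half-excited macrostate has by far the most microstates, hence the
# highest entropy: it is the equilibrium configuration.
print(boltzmann_entropy_two_state(100, 50) > boltzmann_entropy_two_state(100, 10))  # True
```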
Constantin Carathéodory linked entropy with a mathematical definition of irreversibility, in terms of trajectories and integrability.

Excerpt source: Wikipedia, the free encyclopedia. Read the full "Entropy" article on Wikipedia.